Semi-Amortized Variational Autoencoders

Authors

  • Yoon Kim
  • Sam Wiseman
  • Andrew C. Miller
  • David Sontag
  • Alexander M. Rush
Abstract

Amortized variational inference (AVI) replaces instance-specific local inference with a global inference network. While AVI has enabled efficient training of deep generative models such as variational autoencoders (VAE), recent empirical work suggests that inference networks can produce suboptimal variational parameters. We propose a hybrid approach: use AVI to initialize the variational parameters, then run stochastic variational inference (SVI) to refine them. Crucially, the local SVI procedure is itself differentiable, so the inference network and generative model can be trained end-to-end with gradient-based optimization. This semi-amortized approach enables the use of rich generative models without experiencing the posterior-collapse phenomenon common in training VAEs for problems like text generation. Experiments show this approach outperforms strong autoregressive and variational baselines on standard text and image datasets.
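The abstract's key idea, amortized initialization followed by a differentiable SVI refinement, can be summarized in a minimal sketch. The code below assumes a PyTorch setup; the names `encoder`, `decoder`, `svi_steps`, and `svi_lr` are illustrative and not taken from the paper's implementation.

```python
# Minimal sketch of semi-amortized inference (assumed PyTorch API;
# `encoder` maps x -> (mu, logvar), `decoder` returns log p(x | z)).
import torch

def semi_amortized_elbo(x, encoder, decoder, svi_steps=10, svi_lr=1e-2):
    # Amortized initialization: the inference network proposes the local
    # variational parameters (mean and log-variance of q(z | x)).
    mu, logvar = encoder(x)

    # Local SVI refinement: a few gradient-ascent steps on the ELBO with
    # respect to (mu, logvar) only. create_graph=True keeps the updates
    # differentiable, so gradients later flow back into the encoder.
    for _ in range(svi_steps):
        elbo = _elbo(x, mu, logvar, decoder)
        g_mu, g_logvar = torch.autograd.grad(elbo, (mu, logvar), create_graph=True)
        mu = mu + svi_lr * g_mu
        logvar = logvar + svi_lr * g_logvar

    # Final ELBO with the refined parameters; backpropagating through the
    # whole refinement trains encoder and decoder end-to-end.
    return _elbo(x, mu, logvar, decoder)

def _elbo(x, mu, logvar, decoder):
    # Reparameterized sample z ~ q(z | x) with a standard-normal prior.
    std = torch.exp(0.5 * logvar)
    z = mu + std * torch.randn_like(std)
    log_px_z = decoder(x, z)  # log p(x | z), summed over data dimensions
    kl = 0.5 * (mu.pow(2) + logvar.exp() - 1.0 - logvar).sum(-1)
    return (log_px_z - kl).mean()
```

In training, one would minimize the negative of this quantity (e.g. `loss = -semi_amortized_elbo(x, encoder, decoder); loss.backward()`), so that both the generative model and the inference network receive gradients through the refinement steps.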

Related articles

Inference Suboptimality in Variational Autoencoders

Amortized inference has led to efficient approximate inference for large datasets. The quality of posterior inference is largely determined by two factors: a) the ability of the variational distribution to model the true posterior and b) the capacity of the recognition network to generalize inference over all datapoints. We analyze approximate inference in variational autoencoders in terms of t...


InfoVAE: Information Maximizing Variational Autoencoders

It has been previously observed that variational autoencoders tend to ignore the latent code when combined with a decoding distribution that is too flexible. This undermines the purpose of unsupervised representation learning. In this paper, we additionally show that existing training criteria can lead to extremely poor amortized inference distributions and overestimation of the posterior varia...


Amortized Variational Compressive Sensing

The goal of statistical compressive sensing is to efficiently acquire and reconstruct high-dimensional signals with much fewer measurements, given access to a finite set of training signals from the underlying domain being sensed. We present a novel algorithmic framework based on autoencoders that jointly learns the acquisition (a.k.a. encoding) and recovery (a.k.a. decoding) functions while im...


Inducing Interpretable Representations with Variational Autoencoders

We develop a framework for incorporating structured graphical models in the encoders of variational autoencoders (VAEs) that allows us to induce interpretable representations through approximate variational inference. This allows us to both perform reasoning (e.g. classification) under the structural constraints of a given graphical model, and use deep generative models to deal with messy, high...


Coarse Grained Exponential Variational Autoencoders

Variational autoencoders (VAE) often use Gaussian or category distribution to model the inference process. This puts a limit on variational learning because this simplified assumption does not match the true posterior distribution, which is usually much more sophisticated. To break this limitation and apply arbitrary parametric distribution during inference, this paper derives a semi-continuous...



Journal:
  • CoRR

Volume: abs/1802.02550  Issue: -

Pages: -

Publication year: 2018